# Multilingual distillation
## nllb-200-distilled-600M-en-zh_CN
A machine translation model fine-tuned from Meta's NLLB-200-distilled-600M, designed specifically for English-to-Simplified-Chinese translation.
Task: Machine Translation · Framework: Transformers · Supports multiple languages · Author: HackerMonica
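A minimal sketch of how a fine-tuned NLLB model like this is typically driven through the Transformers translation pipeline. The model id is assumed from the card's author and name, and the FLORES-200 language codes for English and Simplified Chinese are the standard NLLB convention; verify both against the actual model card before use.

```python
# English -> Simplified Chinese with an NLLB-family model (sketch).
MODEL_ID = "HackerMonica/nllb-200-distilled-600M-en-zh_CN"  # assumed id from the card

# NLLB models address languages via FLORES-200 codes.
SRC_LANG = "eng_Latn"   # English, Latin script
TGT_LANG = "zho_Hans"   # Chinese, Simplified script


def build_translator():
    """Construct the translation pipeline (downloads weights on first use)."""
    from transformers import pipeline  # deferred: heavy dependency
    return pipeline("translation", model=MODEL_ID,
                    src_lang=SRC_LANG, tgt_lang=TGT_LANG)


# Usage (requires downloading the ~600M-parameter model; not run here):
#   translator = build_translator()
#   result = translator("Machine translation is improving quickly.")
#   print(result[0]["translation_text"])
```

The `src_lang`/`tgt_lang` arguments set NLLB's forced target-language token, which is how a single multilingual checkpoint is steered toward one output language.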
## mmlw-retrieval-e5-base
License: Apache-2.0
MMLW (from the Polish "muszę mieć lepszą wiadomość", "I must have a better message") is a Polish neural text encoder optimized for information retrieval tasks, converting queries and passages into 768-dimensional vectors.
Task: Text Embedding · Framework: Transformers · Author: sdadas
## mmlw-retrieval-e5-small
License: Apache-2.0
MMLW (from the Polish "muszę mieć lepszą wiadomość", "I must have a better message") is a Polish neural text encoder optimized for information retrieval tasks, converting queries and passages into 384-dimensional vectors.
Task: Text Embedding · Framework: Transformers · Author: sdadas
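A sketch of the usual retrieval flow for E5-family encoders such as the two MMLW models above: prefix the query with `query: ` and each passage with `passage: ` (an E5 convention; confirm the exact prefixes on the model card), embed both, and rank passages by cosine similarity. The prefixing and scoring helpers below are plain Python; the encoder call itself is shown only in comments because it downloads model weights.

```python
# E5-style retrieval helpers for mmlw-retrieval-e5-base / -small (sketch).

def add_prefixes(query, passages):
    """Prepend the role prefixes that E5-family encoders expect."""
    return ["query: " + query], ["passage: " + p for p in passages]


def cosine(u, v):
    """Cosine similarity, used to rank passage vectors against a query vector."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sum(a * a for a in u) ** 0.5
    norm_v = sum(b * b for b in v) ** 0.5
    return dot / (norm_u * norm_v)


# Usage (requires sentence-transformers and a model download; not run here):
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer("sdadas/mmlw-retrieval-e5-base")  # 768-dim vectors
#   queries, passages = add_prefixes(
#       "Jak dziala wyszukiwarka?",
#       ["Wyszukiwarka indeksuje dokumenty.", "Przepis na pierogi."])
#   q_emb = model.encode(queries)[0]
#   p_embs = model.encode(passages)
#   scores = [cosine(q_emb, p) for p in p_embs]
```

The base and small models differ only in vector width (768 vs. 384 dimensions), so the same flow applies to both; the smaller model trades some retrieval quality for speed and memory.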
## distilbert-base-uk-cased
License: Apache-2.0
A Ukrainian-specific version of the distilbert-base-multilingual-cased model, retaining the original model's representational capabilities and accuracy.
Task: Large Language Model · Framework: Transformers · Author: Geotrend
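Since the card emphasizes that the Ukrainian model retains the multilingual model's representations, a natural use is feature extraction. A sketch under the assumption that the checkpoint id is `Geotrend/distilbert-base-uk-cased` and loads through the standard `AutoModel`/`AutoTokenizer` classes; mean pooling over the last hidden state is one common (not the only) way to get a sentence vector.

```python
# Mean-pooled sentence representations from the Ukrainian DistilBERT (sketch).
MODEL_ID = "Geotrend/distilbert-base-uk-cased"  # assumed id from the card


def embed(texts):
    """Return one mean-pooled vector per input text (DistilBERT hidden size: 768)."""
    import torch
    from transformers import AutoModel, AutoTokenizer  # deferred: heavy deps
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModel.from_pretrained(MODEL_ID)
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state      # (batch, seq, 768)
    mask = batch["attention_mask"].unsqueeze(-1)       # ignore padding tokens
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)


# Usage (downloads the model; not run here):
#   vectors = embed(["Привіт, світе!"])
#   print(vectors.shape)  # one 768-dimensional vector per input
```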